Lecture 2: Source Coding, Conditional Entropy, Mutual Information

Author

  • David Witmer
Abstract

In some cases, the Shannon code does not perform optimally. Consider a Bernoulli random variable X with parameter 0.0001. An optimal encoding requires only one bit per value of X. The Shannon code, which assigns each value x a codeword of length ⌈log₂(1/p(x))⌉, encodes 0 with 1 bit but encodes 1 with ⌈log₂(10⁴)⌉ = 14 bits. This is good on average but bad in the worst case.

We can also compare the Shannon code to the Huffman code. The Huffman code is optimal, so its expected length is never greater than that of the Shannon code, but there are examples in which a single value is encoded with more bits by a Huffman code than by a Shannon code. Consider a random variable X that takes values a, b, c, and d with probabilities 1/3, 1/3, 1/4, and 1/12, respectively. A Shannon code encodes a, b, c, and d with 2, 2, 2, and 4 bits, respectively. On the other hand, there is an optimal Huffman code that encodes a, b, c, and d with 1, 2, 3, and 3 bits, respectively. Note that c is encoded with more bits by the Huffman code than by the Shannon code, yet the Huffman code has shorter expected length (2 bits versus 13/6 ≈ 2.17 bits). Also note that the optimal code is not unique: encoding every value with 2 bits gives the same expected length.
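To make both examples concrete, here is a minimal Python sketch (my own illustration, not part of the original notes; the function names are hypothetical). It computes Shannon codeword lengths ⌈log₂(1/p(x))⌉ and builds a Huffman code by the standard greedy merge of the two least probable subtrees, then compares expected lengths.

    import heapq
    import math
    from collections import Counter

    def shannon_lengths(probs):
        """Shannon code: symbol x gets a codeword of ceil(log2(1/p(x))) bits."""
        return {x: math.ceil(math.log2(1 / p)) for x, p in probs.items()}

    def huffman_lengths(probs):
        """Codeword lengths of a Huffman code (greedy merge of the two least
        probable subtrees; ties broken by insertion order)."""
        heap = [(p, i, [x]) for i, (x, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        depth = Counter()
        tiebreak = len(heap)
        while len(heap) > 1:
            p1, _, xs1 = heapq.heappop(heap)
            p2, _, xs2 = heapq.heappop(heap)
            for x in xs1 + xs2:   # every symbol in a merged subtree
                depth[x] += 1     # moves one level deeper in the code tree
            tiebreak += 1
            heapq.heappush(heap, (p1 + p2, tiebreak, xs1 + xs2))
        return dict(depth)

    def expected_length(probs, lengths):
        return sum(p * lengths[x] for x, p in probs.items())

    # First example: Bernoulli(0.0001).
    bern = {0: 0.9999, 1: 0.0001}
    print(shannon_lengths(bern))          # {0: 1, 1: 14}: bad worst case

    # Second example: the four-symbol source from the text.
    probs = {"a": 1/3, "b": 1/3, "c": 1/4, "d": 1/12}
    sh, hu = shannon_lengths(probs), huffman_lengths(probs)
    print(sh)                             # {'a': 2, 'b': 2, 'c': 2, 'd': 4}
    print(hu)                             # here every symbol gets 2 bits
    print(expected_length(probs, sh))     # 13/6 ≈ 2.1667
    print(expected_length(probs, hu))     # 2.0

With this tie-breaking the greedy merge happens to produce the all-2-bit optimum; a different tie-breaking yields the lengths 1, 2, 3, and 3 with the same expected length 2, which is exactly the non-uniqueness noted at the end of the abstract.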





Publication date: 2013